Similar resources
Chernoff Bounds
If m = 2, i.e., P = (p, 1 − p) and Q = (q, 1 − q), we also write DKL(p‖q). The Kullback-Leibler divergence provides a measure of distance between the distributions P and Q: it represents the expected loss of efficiency if we encode an m-letter alphabet with distribution P with a code that is optimal for distribution Q. We can now state the general form of the Chernoff Bound: Theorem 1.1. Let X1...
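For reference, a minimal sketch of the quantities involved, using the standard definitions (the exact statement of Theorem 1.1 is truncated above, so the displayed bound is one common form it could take, not necessarily the one in the article):
\[
D_{\mathrm{KL}}(P\|Q) = \sum_{i=1}^{m} p_i \log \frac{p_i}{q_i},
\qquad
D_{\mathrm{KL}}(p\|q) = p \log\frac{p}{q} + (1-p)\log\frac{1-p}{1-q}.
\]
For i.i.d. Bernoulli(p) variables X_1, ..., X_n and a > p, the relative-entropy form of the Chernoff bound reads
\[
\Pr\!\Big[\tfrac{1}{n}\sum_{i=1}^{n} X_i \ge a\Big] \le e^{-n\, D_{\mathrm{KL}}(a\|p)}.
\]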
Load Balancing and Chernoff Bounds
This week, we consider a very simple load-balancing problem. Suppose you have n machines and m jobs. You want to assign the jobs to machines such that all machines have approximately the same load. Of course, there is a solution with load at most ⌈m/n⌉ on every machine, but that requires central coordination. Without central coordination, the easiest thing you can do is let each job draw one m...
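A minimal simulation sketch of this randomized assignment, assuming each job picks a machine uniformly and independently at random (the function name and parameters are illustrative, not taken from the lecture notes):

import random
from collections import Counter

def max_load(n_machines, n_jobs, rng=random):
    # Each job independently picks a machine uniformly at random,
    # then we report the load of the most heavily loaded machine.
    loads = Counter(rng.randrange(n_machines) for _ in range(n_jobs))
    return max(loads.values())

# Example: with m = n jobs, the maximum load is typically much larger than
# the centrally coordinated optimum of ceil(m/n) = 1.
print(max_load(n_machines=1000, n_jobs=1000))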
Chernoff Bounds, and Some Applications
Preliminaries: Before we venture into the Chernoff bound, let us recall two simple bounds on the probability that a random variable deviates from the mean by a certain amount: Markov's inequality and Chebyshev's inequality. Markov's inequality only applies to non-negative random variables and gives us a bound depending on the expectation of the random variable.
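For reference, the two inequalities recalled here are, in their standard statements,
\[
\Pr[X \ge a] \le \frac{\mathbb{E}[X]}{a} \quad \text{(Markov, for } X \ge 0,\ a > 0\text{)},
\qquad
\Pr\big[\,|X - \mathbb{E}[X]| \ge a\,\big] \le \frac{\operatorname{Var}(X)}{a^2} \quad \text{(Chebyshev, } a > 0\text{)}.
\]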
Lecture 2: Matrix Chernoff bounds
The purpose of my second and third lectures is to discuss spectral sparsifiers, which are the second key ingredient in most fast Laplacian solvers. In this lecture we will discuss concentration bounds for sums of random matrices, which are an important technical tool underlying the simplest sparsifier construction.
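As a reference point for the kind of concentration bound meant here, one standard matrix Chernoff bound (in the form popularized by Tropp; the lecture's exact statement may differ) is the following. Let X_1, ..., X_k be independent random positive semidefinite d x d matrices with \(\lambda_{\max}(X_i) \le R\) almost surely, and let \(\mu_{\max} = \lambda_{\max}\big(\mathbb{E}\sum_i X_i\big)\). Then for \(\delta > 0\),
\[
\Pr\Big[\lambda_{\max}\Big(\sum_i X_i\Big) \ge (1+\delta)\,\mu_{\max}\Big]
\le d \cdot \Big(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\Big)^{\mu_{\max}/R}.
\]
The extra factor of d, compared with the scalar Chernoff bound, is the price paid for controlling all eigenvalues simultaneously.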
Journal
Journal title: Bernoulli
Year: 2014
ISSN: 1350-7265
DOI: 10.3150/12-bej484